Model pruning #21561


Closed
wants to merge 18 commits into from

Conversation

pctablet505
Collaborator

Not ready for review or merge

pctablet505 and others added 15 commits August 5, 2025 15:35
Refactored the Keras pruning implementation to use a modular design with a new PruningConfig class, core pruning utilities, and improved callback and model APIs. Moved core logic to keras/src/pruning/core.py, added keras/src/pruning/config.py for configuration, and updated callbacks and Model.prune() to use the new config-driven approach. Added public API exports, improved test coverage with test_refactored_pruning.py, and removed duplicated logic from callbacks and model code.
Refactored pruning schedule classes to support extensible schedules (ConstantSparsity, PolynomialDecay, LinearDecay) and improved their API. Updated PruningConfig to accept either a schedule string or a PruningSchedule instance, and removed the deprecated pruning_policy.py. Updated __init__.py to reflect new public API.
Introduces new pruning methods (L1Pruning, LnPruning, SaliencyPruning, TaylorPruning) and updates the pruning API to support them. Refactors method selection logic, expands PruningConfig validation, and updates core pruning logic to handle both string and class-based methods. Adds comprehensive tests for advanced pruning methods in test_advanced_pruning.py and updates __init__.py exports. Improves documentation and error handling throughout the pruning modules.
Extended pruning configuration and core logic to support advanced pruning methods such as saliency and Taylor pruning. Added new parameters (dataset, loss_fn, n) to PruningConfig and updated method signatures to pass necessary context. Implemented fallback to magnitude pruning if required data is unavailable. Updated PruningMethod interface and relevant subclasses to accept additional arguments for gradient-based pruning.
Introduces a new direct-parameter pruning API supporting selective layer pruning via names and regex patterns, with detailed reporting and analysis utilities. Updates callbacks and model methods to use the new interface, deprecates config-based usage, and adds comprehensive examples and utility functions for sparsity/inference benchmarking and comparison. Enhances core pruning logic to support flexible layer selection and reporting.
Eliminates the PruningConfig class and all related legacy config-based code from the pruning API, callbacks, and model methods. Updates error messages and documentation to reflect the removal. Cleans up imports and removes tests that depended on PruningConfig.
Introduces get_pruning_mask and get_inverted_pruning_mask utilities for advanced pruning workflows, such as continual learning and selective weight freezing. Adds a 'reinitialize' option to pruning methods, enabling pruned weights to be reinitialized instead of zeroed, supporting the 'Pruning-then-Expanding' paradigm. Refactors pruning logic for better modularity and error handling, and updates public API exports accordingly. Removes redundant final sparsity print in PruningCallback.
Refactored pruning core utilities for better modularity and error handling, including extraction of kernel weight checks and pruning method instantiation. Improved gradient-based pruning method validation and centralized pruning statistics reporting. Pruning schedule classes now share validation and progress logic, reducing code duplication. Removed unused demo and utility files from utils.
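The schedule classes named in these commits (ConstantSparsity, PolynomialDecay, LinearDecay) are described but not shown in this thread. A minimal PolynomialDecay-style schedule, in the spirit of Zhu & Gupta (2017), might look like the following sketch; the function and parameter names here are illustrative, not the PR's exact signatures:

```python
def polynomial_decay_sparsity(step, start_step, end_step,
                              initial_sparsity=0.0, final_sparsity=0.8,
                              power=3):
    """Sparsity target at a given training step (illustrative sketch).

    Ramps from initial_sparsity at start_step to final_sparsity at
    end_step, front-loading the increase via a cubic polynomial.
    """
    if step <= start_step:
        return initial_sparsity
    if step >= end_step:
        return final_sparsity
    progress = (step - start_step) / (end_step - start_step)
    return final_sparsity + (initial_sparsity - final_sparsity) * (1 - progress) ** power
```

For example, halfway through a 0-to-100 step ramp with power 3, the target is 0.8 + (0 - 0.8) * 0.5**3 = 0.7, i.e. most of the sparsity is applied early while the network can still recover.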
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Summary of Changes

Hello @pctablet505, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a comprehensive and user-friendly model pruning API to Keras. My goal is to simplify the process of applying sparsity to neural networks by moving to a direct parameter interface, enhancing layer selection capabilities, and providing robust tools for analyzing the effects of pruning.

Highlights

  • Simplified Pruning API: I've refactored the model.prune() method to accept direct parameters (e.g., sparsity, method) instead of requiring a PruningConfig object, making it more intuitive to use.
  • Flexible Layer Selection: The new API allows for precise control over which layers are pruned, supporting selection by exact layer names or using regex patterns. This applies to both the model.prune() method and the new pruning callbacks.
  • New Pruning Callbacks: I've added PruningCallback for gradual pruning during training and PostTrainingPruning for one-shot pruning after training, both leveraging the simplified parameter interface and layer selection.
  • Advanced Pruning Methods: Beyond magnitude-based pruning (L1, L2, structured), I've introduced gradient-based methods like SaliencyPruning and TaylorPruning to identify and remove less important weights based on their impact on the model's output.
  • Comprehensive Analysis and Benchmarking Tools: A suite of utility functions has been added to keras.src.pruning.pruning_utils, enabling detailed analysis of model sparsity, comparison of sparsity between models, and benchmarking of inference performance before and after pruning.
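What the sparsity-analysis utilities report boils down to counting zero-valued weights. A back-of-the-envelope sketch in plain NumPy (illustrative only; not the PR's analyze_sparsity or get_model_sparsity implementation):

```python
import numpy as np

def sparsity_of(arrays):
    """Fraction of exactly-zero entries across a list of weight arrays."""
    total = sum(a.size for a in arrays)
    zeros = sum(int(np.count_nonzero(a == 0)) for a in arrays)
    return zeros / total

# Two toy "kernels": 3 zeros out of 6 entries overall.
weights = [np.array([0.0, 1.0, 0.0, 2.0]), np.array([[3.0, 0.0]])]
overall = sparsity_of(weights)  # → 0.5
```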

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a comprehensive model pruning API to Keras, including various pruning methods, analysis utilities, and callbacks. The core implementation of the new model.prune() method and the underlying pruning logic appears well-designed and robust. However, the accompanying example files and tests have several critical issues that need to be addressed. These include incorrect API usage, bugs in the example model definitions, and non-portable hardcoded paths in test files. My review focuses on fixing these issues to ensure the examples and tests are correct and functional, which will be crucial for validating the new features and for user adoption.

# Also create a more complex model with conv layers
inputs = keras.Input(shape=(28, 28, 1), name='input')
x = keras.layers.Conv2D(32, 3, activation='relu', name='conv2d_1')(inputs)
x = keras.layers.Conv2D(64, 3, activation='relu', name='conv2d_2')(inputs)

critical

The second Conv2D layer is incorrectly connected to the original inputs tensor instead of the output of the first Conv2D layer (x). This causes the first convolutional layer (conv2d_1) to be a dead-end branch in the computation graph, meaning it does not contribute to the model's output.

Suggested change:
- x = keras.layers.Conv2D(64, 3, activation='relu', name='conv2d_2')(inputs)
+ x = keras.layers.Conv2D(64, 3, activation='relu', name='conv2d_2')(x)

Comment on lines +81 to +99
if method_type in ["saliency", "taylor"]:
    config = PruningConfig(
        sparsity=0.5,  # 50% sparsity
        method=method_type,
        dataset=dataset,
        loss_fn=model.loss
    )
else:
    config = PruningConfig(
        sparsity=0.5,
        method=method_type
    )

# Clone and prune model
pruned_model = keras.models.clone_model(model)
pruned_model.set_weights(model.get_weights())

try:
    stats = pruned_model.prune(config)

critical

This example appears to be using an outdated or incorrect API. The model.prune() method does not accept a PruningConfig object. Instead, it expects pruning parameters like sparsity and method to be passed as direct keyword arguments. This code will raise a TypeError and needs to be updated to reflect the new API design.

Suggested change:
- if method_type in ["saliency", "taylor"]:
-     config = PruningConfig(
-         sparsity=0.5,  # 50% sparsity
-         method=method_type,
-         dataset=dataset,
-         loss_fn=model.loss
-     )
- else:
-     config = PruningConfig(
-         sparsity=0.5,
-         method=method_type
-     )
- # Clone and prune model
- pruned_model = keras.models.clone_model(model)
- pruned_model.set_weights(model.get_weights())
- try:
-     stats = pruned_model.prune(config)
+ if method_type in ["saliency", "taylor"]:
+     prune_kwargs = {
+         "dataset": dataset,
+         "loss_fn": model.loss
+     }
+ else:
+     prune_kwargs = {}
+ # Clone and prune model
+ pruned_model = keras.models.clone_model(model)
+ pruned_model.set_weights(model.get_weights())
+ try:
+     stats = pruned_model.prune(
+         sparsity=0.5,
+         method=method_type,
+         **prune_kwargs
+     )

Comment on lines +82 to +88
pruning_callback = pruning.PruningCallback(
    target_sparsity=0.7,
    start_step=2,
    end_step=8,
    frequency=2,
    verbose=False,
)

critical

The PruningCallback is being initialized with a target_sparsity argument, but the correct parameter name is sparsity. This will cause a TypeError when the test is run.

Suggested change:
  pruning_callback = pruning.PruningCallback(
-     target_sparsity=0.7,
+     sparsity=0.7,
      start_step=2,
      end_step=8,
      frequency=2,
      verbose=False,
  )

Comment on lines +5 to +12
# Add the local keras directory to the beginning of sys.path
# to ensure we use the local codebase instead of any pip-installed keras
sys.path.insert(0, "/Users/hellorahul/Projects/keras")

# Remove any existing keras from sys.modules to force fresh import
modules_to_remove = [k for k in sys.modules.keys() if k.startswith("keras")]
for module in modules_to_remove:
    del sys.modules[module]

critical

This test script includes a hardcoded absolute path to a user's local directory. This makes the test non-portable and will cause it to fail on any other developer's machine or in a CI environment. The sys.path manipulation should be removed.
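If local-checkout imports are genuinely needed, a portable alternative is to derive the repo root from the test file's own location rather than hardcoding an absolute, user-specific path. A sketch (the `parents[1]` depth is an assumption about where the test file sits relative to the repo root):

```python
import sys
from pathlib import Path

# Resolve the repo root relative to this file instead of a hardcoded
# absolute path; adjust parents[...] to match the actual layout.
repo_root = Path(__file__).resolve().parents[1]
sys.path.insert(0, str(repo_root))
```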

try:
    # Test imports
    print("1. Testing imports...")
    from keras.src.utils import pruning_utils

critical

This import is incorrect. There is no pruning_utils module in keras.src.utils. The pruning utility functions are exposed under the keras.pruning namespace. This will cause an ImportError.

Suggested change:
- from keras.src.utils import pruning_utils
+ from keras.pruning import get_model_sparsity

Comment on lines +66 to +69
config = PruningConfig(sparsity=0.5, method=method_name)
initial_sparsity = get_model_sparsity(test_model)

stats = test_model.prune(config)

critical

The test is using PruningConfig, which is not part of the public API and is inconsistent with the new model.prune() method. The prune method expects direct keyword arguments, not a config object. This will cause the test to fail.

Suggested change:
- config = PruningConfig(sparsity=0.5, method=method_name)
- initial_sparsity = get_model_sparsity(test_model)
- stats = test_model.prune(config)
+ # Test pruning
+ stats = test_model.prune(sparsity=0.5, method=method_name)

Comment on lines +137 to +146
# Test SaliencyPruning
saliency_method = SaliencyPruning(model, dummy_loss, x_sample, y_sample)
weights = model.layers[0].kernel.value
mask = saliency_method.compute_mask(weights, 0.3)
print(" ✓ SaliencyPruning instance created and mask computed")

# Test TaylorPruning
taylor_method = TaylorPruning(model, dummy_loss, x_sample, y_sample)
mask = taylor_method.compute_mask(weights, 0.3)
print(" ✓ TaylorPruning instance created and mask computed")

critical

SaliencyPruning and TaylorPruning are being instantiated with arguments, but their __init__ methods are empty. The required parameters (model, loss_fn, dataset) should be passed to the compute_mask method via **kwargs. This test is incorrectly implemented and will fail.

Suggested change:
- # Test SaliencyPruning
- saliency_method = SaliencyPruning(model, dummy_loss, x_sample, y_sample)
- weights = model.layers[0].kernel.value
- mask = saliency_method.compute_mask(weights, 0.3)
- print(" ✓ SaliencyPruning instance created and mask computed")
- # Test TaylorPruning
- taylor_method = TaylorPruning(model, dummy_loss, x_sample, y_sample)
- mask = taylor_method.compute_mask(weights, 0.3)
- print(" ✓ TaylorPruning instance created and mask computed")
+ # Test SaliencyPruning
+ saliency_method = SaliencyPruning()
+ weights = model.layers[0].kernel.value
+ mask = saliency_method.compute_mask(
+     weights, 0.3, model=model, dataset=(x_sample, y_sample), loss_fn=dummy_loss
+ )
+ print(" ✓ SaliencyPruning instance created and mask computed")
+ # Test TaylorPruning
+ taylor_method = TaylorPruning()
+ mask = taylor_method.compute_mask(
+     weights, 0.3, model=model, dataset=(x_sample, y_sample), loss_fn=dummy_loss
+ )
+ print(" ✓ TaylorPruning instance created and mask computed")
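The kwargs-based interface the reviewer describes, combined with the magnitude fallback mentioned in the commit history, can be sketched generically. The class name below is a hypothetical stand-in, not the PR's actual TaylorPruning, and the scoring is simplified to magnitude only:

```python
import numpy as np

class TaylorPruningSketch:
    """Hypothetical pruning method: context arrives via **kwargs."""

    def compute_mask(self, weights, sparsity, **kwargs):
        model = kwargs.get("model")
        dataset = kwargs.get("dataset")
        loss_fn = kwargs.get("loss_fn")
        if model is None or dataset is None or loss_fn is None:
            # Fall back to plain magnitude pruning when the gradient
            # context is unavailable (mirrors the fallback behavior
            # described in the commit messages).
            return self._magnitude_mask(weights, sparsity)
        # A real implementation would compute gradient-based scores
        # here; this sketch reuses the magnitude criterion.
        return self._magnitude_mask(weights, sparsity)

    @staticmethod
    def _magnitude_mask(weights, sparsity):
        k = int(sparsity * weights.size)  # number of entries to zero
        if k == 0:
            return np.ones_like(weights)
        thresh = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
        return (np.abs(weights) > thresh).astype(weights.dtype)

# Missing context triggers the magnitude fallback.
mask = TaylorPruningSketch().compute_mask(
    np.array([1.0, -0.2, 0.05, 3.0]), 0.5,
    model=None, dataset=None, loss_fn=None,
)
```

Keeping the constructor empty and threading context through `compute_mask(**kwargs)` lets string-based method lookup instantiate any method uniformly, which is presumably why the review flags constructor arguments as incorrect usage.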


import keras
import numpy as np
from keras.src.pruning import complete_pruning_analysis, analyze_sparsity

medium

This example imports from keras.src, which is an internal API path. To demonstrate the public API correctly, imports should be made from keras.pruning.

Suggested change:
- from keras.src.pruning import complete_pruning_analysis, analyze_sparsity
+ from keras.pruning import complete_pruning_analysis, analyze_sparsity

)

print(f"\n🎯 Hidden layers only analysis...")
from keras.src.pruning import compare_sparsity, print_sparsity_report

medium

This import uses the internal keras.src path. It should use the public API path keras.pruning to reflect correct user-facing usage.

Suggested change:
- from keras.src.pruning import compare_sparsity, print_sparsity_report
+ from keras.pruning import compare_sparsity, print_sparsity_report

Comment on lines +13 to +16
from keras.src.pruning import PruningConfig
from keras.src.pruning import complete_pruning_analysis, analyze_sparsity, benchmark_inference
from keras.src.pruning import compare_sparsity, compare_inference_speed
from keras.src.pruning import print_sparsity_report, print_benchmark_report

medium

These imports use the internal keras.src path. To demonstrate the public API, they should be changed to import from keras.pruning. Additionally, PruningConfig is not part of the public API and should not be used in this example, as the new API uses direct parameters.

Suggested change:
- from keras.src.pruning import PruningConfig
- from keras.src.pruning import complete_pruning_analysis, analyze_sparsity, benchmark_inference
- from keras.src.pruning import compare_sparsity, compare_inference_speed
- from keras.src.pruning import print_sparsity_report, print_benchmark_report
+ from keras.pruning import complete_pruning_analysis, analyze_sparsity, benchmark_inference
+ from keras.pruning import compare_sparsity, compare_inference_speed
+ from keras.pruning import print_sparsity_report, print_benchmark_report

@codecov-commenter

codecov-commenter commented Aug 8, 2025

Codecov Report

❌ Patch coverage is 15.84158% with 765 lines in your changes missing coverage. Please review.
✅ Project coverage is 81.69%. Comparing base (4e1b250) to head (205776c).
⚠️ Report is 8 commits behind head on master.

Files with missing lines Patch % Lines
keras/src/pruning/pruning_method.py 9.97% 361 Missing ⚠️
keras/src/pruning/pruning_utils.py 11.17% 151 Missing ⚠️
keras/src/pruning/core.py 11.11% 136 Missing ⚠️
keras/src/callbacks/pruning.py 22.36% 59 Missing ⚠️
keras/src/pruning/pruning_schedule.py 29.57% 50 Missing ⚠️
keras/src/models/model.py 11.11% 8 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master   #21561      +/-   ##
==========================================
- Coverage   82.73%   81.69%   -1.05%     
==========================================
  Files         567      573       +6     
  Lines       56464    57380     +916     
  Branches     8825     8984     +159     
==========================================
+ Hits        46718    46874     +156     
- Misses       7582     8345     +763     
+ Partials     2164     2161       -3     
Flag Coverage Δ
keras 81.50% <15.84%> (-1.05%) ⬇️
keras-jax 63.02% <15.84%> (-0.80%) ⬇️
keras-numpy 57.61% <15.84%> (-0.69%) ⬇️
keras-openvino 34.37% <15.84%> (-0.27%) ⬇️
keras-tensorflow 63.45% <15.84%> (-0.81%) ⬇️
keras-torch 63.07% <15.84%> (-0.81%) ⬇️

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.